Path integral based convolution and pooling for graph neural networks

Authors

Abstract

Graph neural networks (GNNs) extend the functionality of traditional neural networks to graph-structured data. Similar to CNNs, an optimized design of graph convolution and pooling is key to success. Borrowing ideas from physics, we propose a path integral based graph neural network (PAN) for classification and regression tasks on graphs. Specifically, we consider a convolution operation that involves every path linking the message sender and receiver, with learnable weights depending on the path length, which corresponds to the maximal entropy random walk. It generalizes the graph Laplacian to a new transition matrix that we call the maximal entropy transition (MET) matrix, derived from a path integral formalism. Importantly, the diagonal entries of the MET matrix are directly related to the subgraph centrality, thus providing a natural and adaptive pooling mechanism. PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures. We can view most existing GNN architectures as special cases of PAN. Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks, including a new benchmark dataset from statistical mechanics, which we propose in order to boost applications of GNNs in the physical sciences.
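As a rough illustration of the mechanism described in the abstract, the sketch below builds a weighted sum of adjacency powers on a toy graph. The cutoff `L_cut`, the fixed weights `w`, and the row-sum normalization are simplifying assumptions made here for illustration; in PAN the per-length weights are learnable and the normalization follows the MET formalism from the paper.

```python
import numpy as np

# Toy graph: 4-node path 0-1-2-3
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

L_cut = 3                                # maximal path length (illustrative cutoff)
w = np.array([1.0, 0.5, 0.25, 0.125])    # per-length weights; learnable in PAN

# Weighted sum over adjacency powers: S = sum_n w_n * A^n
S = sum(w[n] * np.linalg.matrix_power(A, n) for n in range(L_cut + 1))

# The diagonal of S counts weighted closed walks, giving a
# subgraph-centrality-like score of the kind PAN uses for adaptive pooling
centrality = np.diag(S)

# Symmetric normalization by row sums yields a transition-style matrix
Z = np.diag(S.sum(axis=1))
M = np.linalg.inv(np.sqrt(Z)) @ S @ np.linalg.inv(np.sqrt(Z))

# One PAN-style convolution step on node features X: H = M @ X @ W_feat
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))
W_feat = rng.standard_normal((2, 2))
H = M @ X @ W_feat
```

On this path graph, the end nodes receive a lower closed-walk score than the interior nodes, matching the intuition that the diagonal entries track subgraph centrality.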


Similar resources

Flip-Rotate-Pooling Convolution and Split Dropout on Convolution Neural Networks for Image Classification

This paper presents a new version of Dropout called Split Dropout (sDropout) and rotational convolution techniques to improve CNNs’ performance on image classification. The widely used standard Dropout has the advantage of preventing deep neural networks from overfitting by randomly dropping units during training. Our sDropout randomly splits the data into two subsets and keeps both rather than dis...


Pricing Derivatives by Path Integral and Neural Networks

Recent progress in the development of efficient computational algorithms to price financial derivatives is summarized. A first algorithm is based on a path integral approach to option pricing, while a second algorithm makes use of a neural network parameterization of option prices. The accuracy of the two methods is established from comparisons with the results of the standard procedures used i...


Temporal Pyramid Pooling Based Convolutional Neural Networks for Action Recognition

Encouraged by the success of Convolutional Neural Networks (CNNs) in image classification, much effort has recently been spent on applying CNNs to video-based action recognition problems. One challenge is that a video contains a varying number of frames, which is incompatible with the standard input format of CNNs. Existing methods handle this issue either by directly sampling a fixed number of frames or ...


Multipartite Pooling for Deep Convolutional Neural Networks

We propose a novel pooling strategy that learns how to adaptively rank deep convolutional features for selecting more informative representations. To this end, we exploit discriminative analysis to project the features onto a space spanned by the number of classes in the dataset under study. This maps the notion of labels in the feature space into instances in the projected space. We employ the...


Low-memory GEMM-based convolution algorithms for deep neural networks

Deep neural networks (DNNs) require very large amounts of computation both for training and for inference when deployed in the field. A common approach to implementing DNNs is to recast the most computationally expensive operations as general matrix multiplication (GEMM). However, as we demonstrate in this paper, there are a great many different ways to express DNN convolution operations using ...
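One common way to express the convolution-as-GEMM recasting mentioned above is the im2col transform, which unrolls sliding windows into columns so the convolution becomes a single matrix product. The helper below is a minimal single-channel sketch; its name, layout, and the box-filter example are illustrative choices, not taken from the paper.

```python
import numpy as np

def im2col(x, kh, kw):
    """Unroll all kh-by-kw sliding windows of a 2-D array into columns."""
    H, W = x.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((kh * kw, out_h * out_w))
    idx = 0
    for i in range(out_h):
        for j in range(out_w):
            cols[:, idx] = x[i:i + kh, j:j + kw].ravel()
            idx += 1
    return cols

x = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 single-channel image
k = np.ones((3, 3))                            # 3x3 box filter

cols = im2col(x, 3, 3)                         # shape (9, 4): one column per window
# Cross-correlation (convolution as used in CNNs) reduces to one GEMM:
y = (k.ravel() @ cols).reshape(2, 2)
```

With multiple filters, `k.ravel()` becomes a matrix with one row per filter, and the same `cols` buffer is reused, which is exactly why the memory footprint of the unrolled buffer matters at DNN scale.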



Journal

Journal title: Journal of Statistical Mechanics: Theory and Experiment

Year: 2021

ISSN: 1742-5468

DOI: https://doi.org/10.1088/1742-5468/ac3ae4